Computational Developments of ψ-learning

Authors

  • Sijin Liu
  • Xiaotong Shen
  • Wing Hung Wong
Abstract

One central problem in science and engineering is predicting unseen outcomes from knowledge gained from data, where generalization accuracy is the key. In the context of classification, we argue that higher generalization accuracy is achievable via ψ-learning, when a certain class of non-convex rather than convex cost functions is employed. To attain this higher generalization accuracy, we propose two computational strategies based on a global optimization technique, difference convex (DC) programming, which relies on a decomposition of the cost function into a difference of two convex functions. The first strategy solves a sequence of quadratic programs. The second, combining this with branch-and-bound, is more computationally intensive but is capable of producing global optima. Numerical experiments suggest that the algorithms realize the desired generalization ability of ψ-learning.
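The DC idea in the abstract can be made concrete with the ramp loss, a standard non-convex surrogate in this literature: ramp(z) = max(0, 1 − z) − max(0, −z), i.e. a difference of two convex hinges. The sketch below (illustrative only, not the authors' algorithm; the function name and all parameters are hypothetical) runs the DCA outer loop — linearize the subtracted convex part at the current iterate, then minimize the resulting convex surrogate — using plain subgradient descent in place of the paper's quadratic-programming inner solver.

```python
import numpy as np

def dca_psi_learning(X, y, lam=0.1, n_outer=15, n_inner=300, lr=0.05, seed=0):
    """DCA sketch for a ramp-loss linear classifier.

    Objective: lam/2 ||w||^2 + (1/n) sum_i ramp(y_i w.x_i), where
    ramp(z) = max(0, 1 - z) - max(0, -z) is a difference of convex hinges.
    Each outer step freezes a subgradient of the subtracted part and
    minimizes the convex surrogate by subgradient descent.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=d)
    for _ in range(n_outer):
        # Subgradient of h(w) = (1/n) sum_i max(0, -y_i w.x_i),
        # frozen at the current iterate (the DC linearization step).
        viol = (y * (X @ w) < 0).astype(float)
        grad_h = -(viol[:, None] * y[:, None] * X).mean(axis=0)
        for _ in range(n_inner):
            m = y * (X @ w)
            # Subgradient of g(w) = lam/2 ||w||^2 + (1/n) sum_i max(0, 1 - m_i).
            grad_g = lam * w - ((m < 1.0).astype(float)[:, None] * y[:, None] * X).mean(axis=0)
            # Descend on the convex surrogate g(w) - <grad_h, w>.
            w -= lr * (grad_g - grad_h)
    return w
```

Because each surrogate upper-bounds the DC objective and is tight at the current iterate, the outer loop monotonically decreases the objective, though (as the abstract notes) only the branch-and-bound variant can certify a global optimum.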


Similar articles

Multicategory ψ-Learning and Support Vector Machine: Computational Tools

Many margin-based binary classification techniques, such as the support vector machine (SVM) and ψ-learning, deliver high performance. An earlier article proposed a new multicategory ψ-learning methodology that shows great promise in generalization ability. However, ψ-learning is computationally difficult because it requires handling a nonconvex minimization problem. In this article, we propose two co...


Regularization and the small-ball method II: complexity dependent error rates

For a convex class of functions $F$, a regularization function $\Psi(\cdot)$, and given the random data $(X_i, Y_i)_{i=1}^{N}$, we study estimation properties of regularization procedures of the form $\hat{f} \in \operatorname{argmin}_{f \in F} \big( \frac{1}{N} \sum_{i=1}^{N} (Y_i - f(X_i))^2 + \lambda \Psi(f) \big)$ for some well-chosen regularization parameter $\lambda$. We obtain bounds on the $L_2$ estimation error rate that depend on the complexity of the "true model" $F^* := \{f \in ...$


Matheuristics for Ψ-Learning

Recently, the so-called ψ-learning approach, the Support Vector Machine (SVM) classifier obtained with the ramp loss, has attracted attention from the computational point of view. A Mixed Integer Nonlinear Programming (MINLP) formulation has been proposed for ψ-learning, but solving this MINLP formulation to optimality is only possible for datasets of small size. For datasets of more realistic ...


Introduction to Boosting: Origin, Practice and Recent Developments

In this review, we will introduce the audience to the notion of boosting, which has become one of the most successful techniques in machine learning and statistical modeling today. We will review its historical origin in computational learning theory, as well as more recent developments that relate it to other notions in statistics (e.g. gradient boosting), and discuss some recent theoretical d...


Optimizing Ψ-learning via Mixed Integer Programming

As a new margin-based classifier, ψ-learning shows great potential for high accuracy. However, the optimization of ψ-learning involves non-convex minimization and is very challenging to implement. In this article, we convert the optimization of ψ-learning into a mixed integer programming (MIP) problem. This enables us to utilize state-of-the-art MIP algorithms to solve ψ-learning. Moreover, t...

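The articles' MIP details are not given in these snippets, but the standard big-M reformulation of the ramp loss can be sketched as follows (an illustrative assumption, not the authors' formulation; the function name, the L1 penalty used to keep the model linear, and all parameters are hypothetical). Each binary z_i "buys out" one margin constraint at unit cost, reproducing loss_i = min(1, hinge_i); SciPy ≥ 1.9 is assumed for the HiGHS-backed `scipy.optimize.milp`.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

def ramp_loss_mip(X, y, lam=0.1, big_m=10.0):
    """Big-M MIP sketch for a ramp-loss linear classifier.

    Per point: loss_i = xi_i + z_i with xi_i in [0,1] and z_i binary;
    constraint y_i x_i.w + xi_i + M z_i >= 1, so z_i = 1 caps the loss
    at 1 (the ramp). |w| is linearized as w = w_plus - w_minus with an
    L1 penalty, keeping the whole problem a linear MIP.
    """
    n, d = X.shape
    # Variable layout: [w_plus (d), w_minus (d), xi (n), z (n)].
    c = np.concatenate([lam * np.ones(2 * d), np.ones(n), np.ones(n)])
    A = np.hstack([y[:, None] * X, -y[:, None] * X, np.eye(n), big_m * np.eye(n)])
    cons = LinearConstraint(A, lb=1.0, ub=np.inf)
    lb = np.zeros(2 * d + 2 * n)
    ub = np.concatenate([np.full(2 * d, np.inf), np.ones(2 * n)])
    integrality = np.concatenate([np.zeros(2 * d + n), np.ones(n)])  # only z is integer
    res = milp(c, constraints=cons, integrality=integrality, bounds=Bounds(lb, ub))
    w = res.x[:d] - res.x[d:2 * d]
    return w, res
```

Unlike the DC strategy, the branch-and-bound search inside the MIP solver certifies global optimality, which is exactly why this route only scales to small datasets, as the Matheuristics snippet above notes.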


Journal:

Volume   Issue

Pages  -

Publication year: 2005